RMSProp is an optimizer intended to address AdaGrad’s tendency to shrink the learning rate to zero. RMSProp adds an additional parameter $\beta$, a decay rate that controls how quickly older squared gradients are forgotten.

Recall that AdaGrad can be written as

$$r_t = r_{t-1} + g_t \odot g_t$$

$$\theta_{t+1} = \theta_t - \frac{\eta}{\sqrt{r_t} + \epsilon} \odot g_t$$

To this, RMSProp adds scaling terms $\beta$ and $1 - \beta$, turning the accumulator $r_t$ into an exponentially decaying average of the squared gradients:

$$r_t = \beta\, r_{t-1} + (1 - \beta)\, g_t \odot g_t$$

The update rule for $\theta$ is unchanged:

$$\theta_{t+1} = \theta_t - \frac{\eta}{\sqrt{r_t} + \epsilon} \odot g_t$$

where again $\eta$ is the learning rate and $\epsilon$ is a small constant added for numerical stability.
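The update above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production optimizer; the function name `rmsprop_step` and the toy objective $f(\theta) = \theta^2$ are chosen here for demonstration, and the hyperparameter defaults ($\eta = 0.01$, $\beta = 0.9$, $\epsilon = 10^{-8}$) are common choices rather than anything mandated by the derivation:

```python
import numpy as np

def rmsprop_step(theta, grad, r, eta=0.01, beta=0.9, eps=1e-8):
    """One RMSProp update; returns the new parameters and accumulator."""
    # Exponentially decaying average of the squared gradients.
    r = beta * r + (1 - beta) * grad * grad
    # Per-coordinate step, scaled by the root of the running average.
    theta = theta - eta / (np.sqrt(r) + eps) * grad
    return theta, r

# Toy example: minimize f(theta) = theta^2 starting from theta = 2.
theta, r = np.array([2.0]), np.zeros(1)
for _ in range(500):
    grad = 2 * theta  # gradient of theta^2
    theta, r = rmsprop_step(theta, grad, r)
```

Note that because $r_t$ decays, the effective step size no longer shrinks monotonically as in AdaGrad: once the gradients level off, the per-coordinate step settles near $\eta$, so the iterate hovers around the minimum rather than stalling far from it.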